Lecture Nine - STAT 212a

Author

  • Aditya Guntuboyina
Abstract

Let $\{p_\theta, \theta \in \Theta\}$ be a family of probability densities with respect to $\mu$, where $\Theta$ is an open subinterval (finite, infinite, or semi-infinite) of the real line. The Fisher Information for this family is defined by

$$I(\theta) := E_\theta \left( \frac{\partial}{\partial \theta} \log p_\theta(X) \right)^2.$$

Note that the densities $p_\theta$ can be on arbitrary spaces; the only condition for defining $I(\theta)$ is that $\theta$ is real. The function $\frac{\partial}{\partial \theta} \log p_\theta(X)$ is called the score function, and thus the Fisher Information is the second moment of the score function. The intuition is that the greater $I(\theta_0)$ is, the easier it is to distinguish $\theta_0$ from neighbouring values of $\theta$ and, therefore, the more accurately $\theta$ can be estimated at $\theta = \theta_0$. Note that Fisher Information depends on the parametrization. In other words, if $\theta = h(\xi)$ and if $q_\xi$ denotes the density $p_{h(\xi)}$, then $\tilde{I}(\xi) = I(h(\xi)) \, (h'(\xi))^2$. This follows by the chain rule; note that $\frac{\partial}{\partial \xi} \log p_{h(\xi)}(X) = \left. \frac{\partial}{\partial \theta} \log p_\theta(X) \right|_{\theta = h(\xi)} h'(\xi)$. Because they are densities, $\int p_\theta(x) \, d\mu(x) = 1$ for all $\theta \in \Theta$.
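As a concrete illustration (not taken from the lecture), the normal location family $N(\theta, 1)$ has score $X - \theta$ and $I(\theta) = 1$. A small Monte Carlo sketch can check both this value and the reparametrization formula; the choice of family, the map $h(\xi) = \xi^2$, and the sample size are all assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_info_mc(theta, score, sampler, n=200_000):
    """Monte Carlo estimate of I(theta) = E_theta[ score(X, theta)^2 ]."""
    x = sampler(theta, n)
    return np.mean(score(x, theta) ** 2)

# N(theta, 1): log p_theta(x) = -(x - theta)^2 / 2 + const, so the score is (x - theta)
score = lambda x, theta: x - theta
sampler = lambda theta, n: rng.normal(theta, 1.0, size=n)

theta0 = 2.0
I_hat = fisher_info_mc(theta0, score, sampler)  # exact value is 1

# Reparametrize theta = h(xi) = xi**2 (illustrative choice):
# the chain rule gives score_xi = (x - xi^2) * h'(xi), so I~(xi) = I(h(xi)) * (2*xi)^2
xi0 = 1.5
score_xi = lambda x, xi: (x - xi**2) * (2 * xi)
sampler_xi = lambda xi, n: rng.normal(xi**2, 1.0, size=n)
I_tilde_hat = fisher_info_mc(xi0, score_xi, sampler_xi)  # exact value is (2 * 1.5)^2 = 9
```

The second estimate checks $\tilde{I}(\xi) = I(h(\xi))(h'(\xi))^2$ numerically: the same data-generating distribution, scored in the $\xi$-parametrization, yields Fisher Information inflated by $(h'(\xi))^2$.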


Related lectures

Lecture Three - STAT 212a

The mutual information I(θ; X) between two random variables θ and X is defined as the Kullback-Leibler divergence between their joint distribution and the product of their marginal distributions. It is interpreted as the amount of information that X contains about θ.
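For discrete random variables this definition can be computed directly; a minimal sketch, assuming the joint distribution is given as a matrix of probabilities (rows indexed by $\theta$, columns by $X$):

```python
import math

def mutual_information(joint):
    """I(theta; X) = KL(joint || marginal_theta x marginal_X) for a discrete joint pmf."""
    p_theta = [sum(row) for row in joint]          # marginal of theta (row sums)
    p_x = [sum(col) for col in zip(*joint)]        # marginal of X (column sums)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (p_theta[i] * p_x[j]))
    return mi

# Independent case: the joint equals the product of marginals, so I(theta; X) = 0
indep = [[0.25, 0.25], [0.25, 0.25]]
# Perfectly dependent case: X determines theta, so I(theta; X) = log 2
dep = [[0.5, 0.0], [0.0, 0.5]]
```

The two example joints bracket the interpretation above: zero information when $X$ says nothing about $\theta$, and $\log 2$ (one bit) when $X$ identifies $\theta$ exactly.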


Lecture Eight - STAT 212a

1 Sudakov Minoration. Suppose that $(T, \rho)$ is a finite metric space and let $\{X_t, t \in T\}$ be a stochastic process indexed by $T$ satisfying


Lecture Six - STAT 212a

Conversely, for every $\epsilon$-packing subset $t_1, \ldots, t_n$ of $T$, the closed balls $B(t_i, \epsilon/2)$, $i = 1, \ldots, n$, are disjoint, and hence every $\epsilon/2$-cover of $T$ must have one point in each of the balls $B(t_i, \epsilon/2)$. As a result, an $\epsilon/2$-cover of $T$ must have at least $n$ points. This implies that $N(\epsilon/2, T) \geq M(\epsilon, T)$. Lemma 1.2 (Volumetric Argument). Let $T = X$ denote the ball in $\mathbb{R}$ of radius $\Gamma$ centered at the origi...
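The packing/covering relationship argued above can be checked by brute force on a small finite metric space; the point set and $\epsilon$ below are illustrative assumptions, not from the lecture.

```python
from itertools import combinations

def packing_number(pts, dist, eps):
    """M(eps, T): size of the largest subset with pairwise distances strictly > eps."""
    best = 0
    for r in range(1, len(pts) + 1):
        for sub in combinations(pts, r):
            if all(dist(a, b) > eps for a, b in combinations(sub, 2)):
                best = max(best, r)
    return best

def covering_number(pts, dist, eps):
    """N(eps, T): size of the smallest subset S with every point within eps of some s in S."""
    for r in range(1, len(pts) + 1):
        for sub in combinations(pts, r):
            if all(any(dist(p, s) <= eps for s in sub) for p in pts):
                return r
    return len(pts)

# A small metric space: five points on the real line with the usual distance
pts = [0.0, 0.7, 1.5, 2.0, 3.6]
dist = lambda a, b: abs(a - b)
eps = 1.0
M = packing_number(pts, dist, eps)
N_half = covering_number(pts, dist, eps / 2)
# The disjoint-balls argument gives N(eps/2, T) >= M(eps, T)
```

Here covers are restricted to points of $T$ itself (internal covers), which suffices for the inequality as argued above.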


Lecture Two - STAT 212a

1 f-divergences. Recall from last lecture the definition of an f-divergence: let $f : (0, \infty) \to \mathbb{R}$ be a convex function with $f(1) = 0$. By virtue of convexity, both the limits $f(0) := \lim_{x \downarrow 0} f(x)$ and $f'(\infty) := \lim_{x \uparrow \infty} f(x)/x$ exist, although they may be equal to $+\infty$. For two probability measures $P$ and $Q$, the f-divergence $D_f(P\|Q)$ is defined by $D_f(P\|Q) := \int_{\{q > 0\}} f\!\left(\frac{p}{q}\right) dQ + f'(\infty) \, P\{...
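For discrete $P$ and $Q$ the defining integral becomes a sum; a minimal sketch, assuming the distributions are given as probability vectors, with $f(x) = x \log x$ recovering the Kullback-Leibler divergence:

```python
import math

def f_divergence(p, q, f, f_prime_inf):
    """D_f(P||Q) = sum over {q_i > 0} of q_i * f(p_i / q_i) + f'(inf) * P{q = 0},
    for discrete distributions given as lists of probabilities."""
    div = sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)
    mass_q_zero = sum(pi for pi, qi in zip(p, q) if qi == 0)
    if mass_q_zero > 0:
        div += f_prime_inf * mass_q_zero
    return div

# f(x) = x log x is convex with f(1) = 0 and f'(inf) = +inf; it yields KL(P||Q)
f_kl = lambda x: x * math.log(x) if x > 0 else 0.0
p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]
kl = f_divergence(p, q, f_kl, math.inf)
kl_direct = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))  # direct KL for comparison
```

Other choices of $f$ slot into the same function, e.g. $f(x) = \frac{1}{2}|x - 1|$ for total variation, since only $f$ and $f'(\infty)$ change.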




Publication date: 2012